Generative Models and Learning Algorithms for Core-Periphery Structured Graphs
We consider core-periphery structured graphs, i.e., graphs with two groups of
nodes: densely connected core nodes and sparsely connected periphery nodes.
The so-called core score of a node reflects the likelihood that it is a core
node. In this paper, we focus on learning the
core scores of a graph from its node attributes and connectivity structure. To
this end, we propose two classes of probabilistic graphical models: affine and
nonlinear. First, we describe affine generative models that capture the
dependence of the node attributes on their core scores, which in turn
determine the graph structure.
Next, we discuss nonlinear generative models in which the partial correlations
of node attributes influence the graph structure through latent core scores. We
develop algorithms for inferring the model parameters and core scores of a
graph when both the graph structure and node attributes are available. When
only the node attributes of graphs are available, we jointly learn a
core-periphery structured graph and its core scores. We provide results from
numerical experiments on several synthetic and real-world datasets to
demonstrate the efficacy of the developed models and algorithms.
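As a rough illustration of the core-score idea, here is a minimal sketch. It is an assumption for exposition, not the paper's actual affine or nonlinear models: each edge appears with probability given by a logistic function of the sum of the endpoints' core scores, and the scores are recovered from an observed adjacency matrix by gradient ascent on the Bernoulli log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_graph(c):
    """Sample an undirected graph whose edge probabilities grow with the
    sum of the endpoints' core scores (logistic link)."""
    n = len(c)
    p = 1.0 / (1.0 + np.exp(-(c[:, None] + c[None, :])))
    a = (rng.random((n, n)) < p).astype(float)
    a = np.triu(a, 1)
    return a + a.T

def infer_core_scores(a, steps=500, lr=0.05):
    """Maximum-likelihood core scores via gradient ascent on the
    Bernoulli log-likelihood of the observed adjacency matrix."""
    n = a.shape[0]
    c = np.zeros(n)
    mask = 1.0 - np.eye(n)          # ignore self-loops
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(c[:, None] + c[None, :])))
        c += lr * ((a - p) * mask).sum(axis=1) / n
    return c

# Hypothetical ground truth: 5 core nodes, 20 periphery nodes.
c_true = np.concatenate([np.full(5, 2.0), np.full(20, -2.0)])
a = sample_graph(c_true)
c_hat = infer_core_scores(a)        # core nodes should receive higher scores
```

With well-separated true scores, the inferred scores rank the core nodes above the periphery; the paper's models additionally couple the scores to node attributes, which this sketch omits.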
Sampling and Recovery of Signals on a Simplicial Complex using Neighbourhood Aggregation
In this work, we focus on sampling and recovery of signals over simplicial
complexes. In particular, we subsample a simplicial signal of a certain order
and focus on recovering multi-order bandlimited simplicial signals of one order
higher and one order lower. To do so, we assume that the simplicial signal
admits the Helmholtz decomposition that relates simplicial signals of these
different orders. Next, we propose an aggregation sampling scheme for
simplicial signals based on the Hodge Laplacian matrix and a simple least
squares estimator for recovery. We also provide theoretical conditions on the
number of aggregations and size of the sampling set required for faithful
reconstruction as a function of the bandwidth of simplicial signals to be
recovered. Numerical experiments are provided to show the effectiveness of the
proposed method.
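The sampling-and-recovery pipeline can be sketched on a toy complex. The following is a simplified, single-order illustration; the specific complex, bandwidth, sampled edges, and number of aggregations are assumptions for exposition, and the paper's multi-order Helmholtz-based setting is not reproduced here.

```python
import numpy as np

# Toy simplicial complex: 4 nodes, 5 edges, 2 triangles.
# Edges e0=(0,1), e1=(0,2), e2=(1,2), e3=(1,3), e4=(2,3);
# triangles t0=(0,1,2), t1=(1,2,3).
B1 = np.array([[-1, -1,  0,  0,  0],   # node-to-edge incidence
               [ 1,  0, -1, -1,  0],
               [ 0,  1,  1,  0, -1],
               [ 0,  0,  0,  1,  1]], dtype=float)
B2 = np.array([[ 1,  0],               # edge-to-triangle incidence
               [-1,  0],
               [ 1,  1],
               [ 0, -1],
               [ 0,  1]], dtype=float)

L1 = B1.T @ B1 + B2 @ B2.T             # Hodge Laplacian on edge signals
eigvals, U = np.linalg.eigh(L1)

k = 3                                  # assumed bandwidth of the edge signal
alpha_true = np.array([1.0, -0.5, 0.3])
x = U[:, :k] @ alpha_true              # bandlimited simplicial (edge) signal

# Aggregation sampling: at a few sampled edges, observe the signal after
# successive diffusions x, L1 x, ... and stack the local observations.
sampled_edges = [0, 2, 4]
T = 2                                  # number of aggregation steps
rows, y = [], []
for t in range(T):
    Lt = np.linalg.matrix_power(L1, t)
    for e in sampled_edges:
        rows.append((Lt @ U[:, :k])[e])
        y.append((Lt @ x)[e])
Phi, y = np.array(rows), np.array(y)

# Least squares recovery of the bandlimited coefficients, hence the signal.
alpha_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)
x_hat = U[:, :k] @ alpha_hat
```

Recovery is exact here because the stacked aggregation matrix Phi has full column rank, mirroring the kind of condition on the number of aggregations and the size of the sampling set that the abstract refers to.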
Fast Graph Convolutional Recurrent Neural Networks
This paper proposes a Fast Graph Convolutional Recurrent Neural Network (FGRNN)
architecture to predict sequences with an underlying graph structure. The
proposed architecture addresses the limitations of the standard recurrent
neural network (RNN), namely vanishing and exploding gradients, which cause
numerical instabilities during training. State-of-the-art architectures that
combine gated RNN architectures, such as Long Short-Term Memory (LSTM) and
Gated Recurrent Unit (GRU) with graph convolutions are known to improve the
numerical stability during the training phase, but at the expense of model
size, as they involve a large number of training parameters. FGRNN addresses this
problem by adding a weighted residual connection with only two extra training
parameters as compared to the standard RNN. Numerical experiments on a real
3D point cloud dataset corroborate the proposed architecture.

Comment: 5 pages. Submitted to the Asilomar Conference on Signals, Systems, and
Computers.
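The weighted residual connection can be sketched as follows. This is a hypothetical minimal cell, not the paper's exact architecture: the input map is taken to be a one-hop graph convolution, and the scalars alpha and beta stand in for the two extra training parameters mentioned in the abstract; all dimensions and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def fgrnn_cell(a_hat, x_t, h_prev, w, u, b, alpha, beta):
    """One recurrent step: a standard RNN update whose input map is a
    one-hop graph convolution, plus a weighted residual connection.
    alpha and beta are the two extra scalar parameters."""
    z = np.tanh(a_hat @ x_t @ w + h_prev @ u + b)
    return alpha * h_prev + beta * z

# Toy setup: 4-node cycle graph, 3 input features, 5 hidden units.
n, f, h = 4, 3, 5
a = np.roll(np.eye(n), 1, axis=0) + np.roll(np.eye(n), -1, axis=0)
a_hat = a / a.sum(axis=1, keepdims=True)   # row-normalised adjacency

w = rng.standard_normal((f, h)) * 0.1
u = rng.standard_normal((h, h)) * 0.1
b = np.zeros(h)
alpha, beta = 0.9, 0.1                     # residual weights

h_t = np.zeros((n, h))
for t in range(10):                        # unroll over a random sequence
    x_t = rng.standard_normal((n, f))
    h_t = fgrnn_cell(a_hat, x_t, h_t, w, u, b, alpha, beta)
```

Because the update blends the previous state with a bounded nonlinearity and alpha + beta <= 1 here, the hidden state stays bounded over long sequences, which is the kind of numerical stability the residual connection is meant to provide.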